Online Nonparametric Regression with General Loss Functions

Authors

  • Alexander Rakhlin
  • Karthik Sridharan
Abstract

This paper establishes minimax rates for online regression with arbitrary classes of functions and general losses. We show that below a certain threshold for the complexity of the function class, the minimax rates depend on both the curvature of the loss function and the sequential complexities of the class. Above this threshold, the curvature of the loss does not affect the rates. Furthermore, for the case of square loss, our results point to an interesting phenomenon: whenever sequential and i.i.d. empirical entropies match, the rates for statistical and online learning are the same. In addition to the study of minimax regret, we derive a generic forecaster that enjoys the established optimal rates. We also provide a recipe for designing online prediction algorithms that can be computationally efficient for certain problems. We illustrate the techniques by deriving existing and new forecasters for the case of finite experts and for online linear regression.
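The finite-experts setting mentioned in the abstract admits a classical baseline. As a minimal sketch (not the paper's specific minimax-optimal forecaster), the following shows an exponentially weighted average forecaster under square loss; the learning rate `eta` and the array layout are illustrative choices, not taken from the paper:

```python
import numpy as np

def exp_weights_forecast(expert_preds, outcomes, eta=1.0):
    """Exponentially weighted average forecaster for square loss.

    expert_preds: (T, N) array; expert_preds[t, i] is expert i's prediction at round t.
    outcomes:     (T,) array of revealed outcomes y_t.
    Returns the learner's predictions and cumulative regret vs. the best expert.
    """
    T, N = expert_preds.shape
    log_w = np.zeros(N)                        # log-weights, initially uniform
    preds = np.empty(T)
    for t in range(T):
        w = np.exp(log_w - log_w.max())        # normalize in log-space for stability
        w /= w.sum()
        preds[t] = w @ expert_preds[t]         # weighted-average prediction
        losses = (expert_preds[t] - outcomes[t]) ** 2
        log_w -= eta * losses                  # multiplicative (Hedge-style) update
    learner_loss = np.sum((preds - outcomes) ** 2)
    best_expert = np.min(np.sum((expert_preds - outcomes[:, None]) ** 2, axis=0))
    return preds, learner_loss - best_expert
```

Because the update is multiplicative in the incurred losses, weight concentrates geometrically on experts with small cumulative square loss, which is the mechanism behind the classical O(log N) regret bounds in the finite-experts case.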


Related Papers

Asymptotic Equivalence and Adaptive Estimation for Robust Nonparametric Regression

Asymptotic equivalence theory developed in the literature so far is only for bounded loss functions. This limits the potential applications of the theory, because many commonly used loss functions in statistical inference are unbounded. In this paper we develop asymptotic equivalence results for robust nonparametric regression with unbounded loss functions. The results imply that all the Gaussi...

On the Robust Modal Local Polynomial Regression

Modal local polynomial regression uses a double kernel as the loss function to gain robustness in nonparametric regression. Current research uses the standard normal density function as the weight function to down-weight the influence of outliers. This paper extends the standard normal weight function to a general class of weight functions. All the theoretical properties found by usi...

Nonparametric Online Regression while Learning the Metric

We study algorithms for online nonparametric regression that learn the directions along which the regression function is smoother. Our algorithm learns the Mahalanobis metric based on the gradient outer product matrix G of the regression function (automatically adapting to the effective rank of this matrix), while simultaneously bounding the regret —on the same data sequence— in terms of the sp...

A New Nonparametric Regression for Longitudinal Data

In many areas of medical research, an analysis of the relation between one response variable and some explanatory variables is desirable, and regression is the most common tool in this situation. If assumptions such as normality can be made for the response variable, parametric regression can be used. In this paper we propose a nonparametric regression that does not require a normality assumption for the response variable, and we focus ...

On the Asymptotic Theory of

In regression analysis, useful information may be lost when the responses are right censored. To estimate nonparametric functions, several estimators based on censored data have been proposed, and their consistency and convergence rates have been studied in the literature, but the optimal rates of global convergence have not yet been obtained. Because of the possible information loss...


Journal:
  • CoRR

Volume: abs/1501.06598

Publication date: 2015